Blind Source Separation Using Maximum Entropy Pdf Estimation Based on Fractional Moments
Authors
Abstract
Recovering a set of independent sources that have been linearly mixed is the main task of blind source separation. Approaches such as the infomax principle, mutual information minimization, and maximum likelihood lead to simple iterative procedures such as natural gradient algorithms. These algorithms depend on a nonlinear function (known as the score or activation function) of the source distributions. Since no prior knowledge of the source distributions is available, the optimality of the algorithms rests on the choice of a suitable parametric density model. In this paper, we propose an adaptive optimal score function based on the fractional moments of the sources. To obtain a parametric model for the source distributions, we use a few sampled fractional moments to construct a maximum entropy probability density function (PDF) estimate. By applying an optimization method, we obtain the optimal fractional moments that best fit the source distributions. Using fractional moments (FM) instead of integer moments makes the maximum entropy PDF estimate converge to the true PDF much faster. The simulation results show that, unlike most previously proposed models for the nonlinear score function, which are limited to particular source families such as sub-Gaussian and super-Gaussian signals or to particular distribution models such as the generalized Gaussian distribution, our new model achieves better results for every source signal without any prior assumption about its statistical behavior.
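The core construction in the abstract — a maximum entropy PDF constrained by a few sampled fractional moments — can be sketched numerically. The following is a minimal illustration, not the paper's algorithm: it fixes the fractional exponents in advance (the paper additionally optimizes them), and fits the multipliers of the maxent density p(x) ∝ exp(−Σₖ λₖ|x|^αₖ) by gradient descent on the convex dual, matching model moments to sample moments on a grid. All names and parameter values are illustrative.

```python
import numpy as np

def maxent_pdf_from_fractional_moments(samples, alphas, x_grid,
                                       n_iter=5000, lr=0.05):
    """Fit p(x) ~ exp(-sum_k lam_k * |x|**alphas[k]) so that its
    fractional moments E|x|^alpha_k match the sample moments.
    Plain gradient descent on the (convex) maxent dual."""
    mu = np.array([np.mean(np.abs(samples) ** a) for a in alphas])  # sample FMs
    phi = np.stack([np.abs(x_grid) ** a for a in alphas])           # (K, N) features
    lam = np.ones(len(alphas))
    dx = x_grid[1] - x_grid[0]
    for _ in range(n_iter):
        logp = -(lam @ phi)
        logp -= logp.max()                 # numerical stabilization
        p = np.exp(logp)
        p /= p.sum() * dx                  # normalize the density on the grid
        model_mu = phi @ p * dx            # fractional moments of the model
        # if a model moment exceeds its sample moment, raise lam_k
        # to sharpen the tail (and vice versa)
        lam += lr * (model_mu - mu)
    return p, lam

# toy usage: estimate the density of a Gaussian sample from three FMs
rng = np.random.default_rng(0)
samples = rng.normal(size=5000)
x = np.linspace(-8.0, 8.0, 1601)
p, lam = maxent_pdf_from_fractional_moments(samples, [0.5, 1.5, 2.0], x)
```

The fitted density can then supply the score function −d/dx log p(x) needed by the separation algorithm; choosing which exponents αₖ to use is the optimization step the paper addresses.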
Similar resources
Speech separation based on the GMM PDF estimation
In this paper, the speech separation task is regarded as a convolutive-mixture Blind Source Separation (BSS) problem. The Maximum Entropy (ME), Minimum Mutual Information (MMI), and Maximum Likelihood (ML) algorithms are the main approaches to solving the BSS problem. The relationship among these three algorithms is analyzed in this paper. Based on t...
Minimum Entropy Algorithms for Source Separation
Minimum entropy or maximum likelihood estimation can be utilized in the blind source separation problem. Based on a local generalized Gaussian probability density model, a set of general anti-Hebbian rules can be derived. These adaptation rules give promising results when tested on real recordings.
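The anti-Hebbian adaptation rules mentioned above belong to the natural gradient family, which can be sketched as follows. This is a generic batch natural gradient ICA update with a tanh score function, a common stand-in for super-Gaussian sources, not the generalized Gaussian rule of the cited paper; the mixing matrix, step size, and iteration count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
s = rng.laplace(size=(2, n))              # two super-Gaussian (Laplacian) sources
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])                # unknown mixing matrix
x_mix = A @ s                             # observed mixtures

W = np.eye(2)                             # separating matrix estimate
eta = 0.1
for _ in range(500):
    y = W @ x_mix
    phi = np.tanh(y)                      # score function suited to super-Gaussian pdfs
    # natural gradient (anti-Hebbian) batch update: W += eta * (I - E[phi(y) y^T]) W
    W = W + eta * (np.eye(2) - (phi @ y.T) / n) @ W

P = W @ A                                 # should approach a scaled permutation matrix
```

At convergence E[phi(y) yᵀ] = I, so the update vanishes; the off-diagonal (anti-Hebbian) terms drive the outputs toward independence, while the diagonal terms fix the output scale.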
Learning from Examples with Quadratic Mutual Information
This paper discusses a novel algorithm to train nonlinear mappers with information-theoretic criteria (entropy or mutual information) directly from a training set. The method is based on a Parzen window estimator and uses Renyi's quadratic definition of entropy and a distance measure based on the Cauchy-Schwartz inequality. We apply the algorithm to the difficult problem of vehicle pose estima...
Sobre separação cega de fontes: proposições e analise de estrategias para processamento multi-usuario
This thesis is devoted to the study of blind source separation techniques applied to multiuser processing in digital communications. Using probability density function (pdf) estimation strategies, two multiuser processing methods are proposed. They aim to recover the transmitted signal by using the Kullback-Leibler similarity measure between the signals' pdf and a parametric model that contains the sig...
Calculation of Leakage in Water Supply Network Based on Blind Source Separation Theory
The economic and environmental losses due to serious leakage in urban water supply networks have increased efforts to control water leakage. However, current methods for leakage estimation are inaccurate, leading to ineffective leakage controls. Therefore, this study proposes a method based on blind source separation (BSS) theory to calculate the leakage of water...
Journal:
Volume Issue
Pages -
Publication date: 2006